Search Results

The problem with TDE and the challenge of T

I recently gave a SQL Supper talk as part of the Microsoft Future Decoded evening community events, and I made a point of not being impressed by Transparent Data Encryption (TDE), be it in SQL Server, Azure SQL Database or Cosmos Db. I would like to explain why. The problem of TDE: I have worked with data and storage engines for some time, and therefore TDE seems straightforward to me. I think a good overview of TDE for SQL Server, Azure SQL Database and Azure SQL Data Warehouse is given here, and I think a similarly good overview of TDE

Uploading Files To Data Lake Storage With PowerShell Part Three

Picking up from where we left off last month, today we’re looking at setting up the Azure Data Lake Storage account. This post is part of a series on automating the process of uploading files to Azure Data Lake Store. Although the entire script is available on Git (posted below), I’m going to go into one function per post so that I can go in greater depth. Part One of this blog series focused on logging in to an Azure Subscription. Part Two focused on setting the Resource Group. As mentioned, today’s function starts on row 74 and is
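For flavour, here is a minimal reconstruction of the idea behind that function; the real one is in the script on Git, so the function name and parameters here are my assumptions rather than the original code.

```powershell
# A hedged sketch, not the original function: create the Data Lake Store
# account only if it does not already exist, keeping the script idempotent.
function Set-AzureDataLakeStoreAccount {
    param (
        [string] $AccountName,
        [string] $ResourceGroupName,
        [string] $Location
    )
    if (-not (Test-AzureRmDataLakeStoreAccount -Name $AccountName)) {
        New-AzureRmDataLakeStoreAccount -Name $AccountName `
            -ResourceGroupName $ResourceGroupName -Location $Location
    }
}
```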

Uploading Files To Data Lake Store With PowerShell Part One

Hello! I’ve recently been working on uploading files to Azure Data Lake Store. It’s quite straightforward and I think a decent introduction to automating a deployment with Azure, as well as a good example of writing scripts that are idempotent, so I’m going to go through them from beginning to end. I’m going to go into one function per day, so this will take 5 days to cover. But I’m hoping that by focusing in more depth, as opposed to trying to cram it all into one post, it will be more informative, and both yourselves and
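To give a sense of the idempotent style the series is aiming for, here is a minimal login sketch, assuming the AzureRM module; the function name is mine, not the one from the series.

```powershell
# Only prompt for credentials when there is no usable context already -
# running the script twice should not prompt twice.
function Add-AzureLogin {
    $context = Get-AzureRmContext -ErrorAction SilentlyContinue
    if (-not $context -or -not $context.Account) {
        Login-AzureRmAccount | Out-Null
    }
    Get-AzureRmContext
}
```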

Uploading Files To Data Lake Storage With PowerShell Part Two

Carrying on from our previous post on automating the process of uploading files to Azure Data Lake Store, we will check if a Resource Group exists; if it does not, then we will create it. Although the entire script is available on Git (posted below), I’m going to go into one function per post so that I can go in greater depth. Part One of this blog series focused on logging in to an Azure Subscription. Today’s function starts on line 42 and is called Set-AzureResourceGroup. Before we go into it though, I want to take a moment
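The shape of such a function is roughly this; a sketch only, since the real Set-AzureResourceGroup lives in the script on Git and its parameters are an assumption here.

```powershell
function Set-AzureResourceGroup {
    param (
        [string] $ResourceGroupName,
        [string] $Location
    )
    # Get-AzureRmResourceGroup errors if the group is missing, so silence that
    $rg = Get-AzureRmResourceGroup -Name $ResourceGroupName -ErrorAction SilentlyContinue
    if (-not $rg) {
        New-AzureRmResourceGroup -Name $ResourceGroupName -Location $Location
    }
    else {
        Write-Verbose "Resource group $ResourceGroupName already exists"
    }
}
```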

Improving Azure Functions throughput

I recently ran into an Azure Functions throughput problem which I logged on Stack Overflow as regular-throughput-troughs-in-azure-functions-requests-per-second. The product group were pretty quick to respond and pointed me to their Azure App Service Team Blog post processing-100000-events-per-second-on-azure-functions/. The post lists five notable configuration choices:

- functions process [Event Hubs] messages in batches
- the WebJobs dashboard is disabled in favour of using Application Insights for monitoring and telemetry
- each event hub is configured with 100 partitions
- data is sent to the event hubs without partition keys
- events are serialized using protocol buffers

Of these, the second and third are most interesting.

Getting started with Azure Policy

Recently, as part of a data classification implementation, certain aspects were implemented using Azure Policy, since it supports auditing of existing resources and can prevent non-compliant resources from being created in the first place. Like most things, the documentation reads well and the samples seem useful, until one has to do something different and go off-piste. There are plenty of VM and tagging samples, though not so many for Azure SQL Database. Therefore, as Sabin is a data engineering consultancy, all examples given here will be Azure SQL Database focused, and not the canonical VM found in so many examples.
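As a taste of what an Azure SQL Database focused policy rule looks like, here is a minimal sketch (not one of the post’s examples) that audits Premium databases; the edition alias is an assumption worth verifying against the published policy aliases.

```json
{
  "if": {
    "allOf": [
      { "field": "type", "equals": "Microsoft.Sql/servers/databases" },
      { "field": "Microsoft.Sql/servers/databases/edition", "equals": "Premium" }
    ]
  },
  "then": { "effect": "audit" }
}
```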

Managing Azure Functions logging to Application Insights

The Azure Functions team have made it incredibly easy to emit telemetry to Application Insights. It really is as easy as updating the Function App’s settings, as described by the App Insights wiki page over at Azure Functions on GitHub. However, if you are on the basic pricing plan for Application Insights, then the 32.3 MB daily allowance gets used up pretty quickly. The remainder of this post is about understanding the telemetry data sent to Application Insights by Azure Functions, and how to configure the function app host.json to filter and reduce the volume of telemetry sent. The naïve approach
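The filtering ends up in the logger section of host.json. As a rough sketch, assuming the v1 host.json schema (category names vary by runtime version), something like this turns routine per-execution chatter down while keeping the aggregates:

```json
{
  "logger": {
    "categoryFilter": {
      "defaultLevel": "Information",
      "categoryLevels": {
        "Host.Results": "Error",
        "Function": "Error",
        "Host.Aggregator": "Information"
      }
    }
  }
}
```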

Overview of Azure Virtual Machine IO performance and throttles

In this post we are going to look at the IO performance of a Virtual Machine in Azure. We are specifically talking about the GS4 machines with premium managed disks. The theory should apply to all classes of machine, but some, such as the L series, have a different configuration for the temporary drive, which is important. The data in this post has been gathered using a mixture of this excellent post https://blogs.technet.microsoft.com/xiangwu/2017/05/14/azure-vm-storage-performance-and-throttling-demystify/ and by generating IO using diskspd and measuring with perfmon. Speed vs Throughput: It is worth pointing out that Azure specifies the performance of the Disks /
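By way of illustration, the IO was generated with commands along these lines; the exact parameters behind the post’s numbers may differ, so treat this as a representative example rather than the test definition.

```powershell
# 60 seconds of 8KB random IO, 30% writes, 8 threads with 8 outstanding IOs,
# caching disabled (-Sh) and latency statistics captured (-L)
.\diskspd.exe -c10G -d60 -r -w30 -t8 -o8 -b8K -Sh -L E:\test\iotest.dat
```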

Fix For Using Azure Active Directory and DacFX

Hello! As part of an SSDT project we have a contained user that authenticates against an Azure Active Directory group (read more on the CREATE USER page). However, the account we are executing deployments with is the SQL Admin account on the Azure SQL instance, and so we get this error:

```
The executed script: CREATE USER [myUser] FOR EXTERNAL PROVIDER;
' Reason: ''
At line:94 char:13
+             Throw $toThrow
+             ~~~~~~~~~~~~~~
    + CategoryInfo          : OperationStopped: (Deployment fail... ' Reason: '':String) [], RuntimeException
    + FullyQualifiedErrorId : Deployment failed: 'Could not deploy package. Error SQL72014: .Net SqlClient Data Provider: Msg 33159,
```

Migrating SSIS Packages to SSIS Azure

Hello! In case you missed the announcement (and there were a lot of announcements during MSIgnite), SQL Server Integration Services is in public preview on Azure! I’ve written about it elsewhere in greater depth, but here are the headlines:

- It makes use of SSIS Scale Out, which was released as part of SQL Server 2017.
- Although it is based on SSIS Scale Out, you can’t actually configure SSIS Scale Out to run on the instance. If this confuses you then read my in-depth post.
- SSISDB is installed in either SQL Azure or on a Managed Instance. You

Do Azure SQL Database External Tables have a place in a micro-service?

I was recently in discussions on using External Tables to link Azure SQL Databases across micro-service boundaries. This has led to some challenging discussions with a client and unexpected opinions internally here at sabin.io. My simple view of a micro-service is of a data store fronted by code, which is in turn behind an API or message subscriber. Importantly, only this code accesses the store. I have arrived at this opinion through many (often heated) discussions with developers implementing services, and through working with teams breaking large services into micro-services to clarify ownership and responsibility, remove dependencies and simplify

SSIS Package Execution In Azure Is Now Available

Well, it’s been some time coming, but SSIS packages are the latest product to make the move from on-premise to Azure. You can now take your SSIS projects and deploy them to the new Platform as a Service (PaaS) offering in Azure. The aim of the team at Microsoft was for users to take their current SSIS packages and just “lift and shift” them to Azure. So in development terms that means there are minimal to no changes to be made in the solution, at least. But before we get into the deployment and running of SSIS packages

Azure Powershell 4.0 may break your scripts

Ensuring backwards compatibility is something that one has to consider very carefully when doing continuous delivery. We are all too well aware of the challenges of this with database systems as, generally, the database lives much longer than the apps that interact with it, and thus one has to maintain the data. SQL has far too many “legacy features” that can’t be changed due to potential breaking changes. Thankfully the SQL team now have a more robust way of managing change, and that’s through the compatibility level for the database. This allows you to upgrade to the latest runtime but
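One simple defence, sketched below with illustrative version numbers, is to pin the module version your scripts were tested against rather than taking whatever is newest:

```powershell
# Install and load a known-good AzureRM version side by side with newer ones
Install-Module AzureRM -RequiredVersion 3.8.0 -Scope CurrentUser
Import-Module AzureRM -RequiredVersion 3.8.0
```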

Migrating SSIS Packages to SSIS Azure part Two – Automating the Deployment

Hello! If you’ve read and followed through my previous post, you will have the World Wide Importers Integration Services project running in SSIS Azure. It’s all very interesting, go and have a read. One thing that is missing from that guide, the documentation, and SSIS in general, is how to automate SSIS deployments. In the WWI SSIS project, there are connection managers whose values we had to update manually to get it to work post-deploy. This is exactly the opposite of what we want to do. Back when SQL Server 2012 was known as Denali, one of the

Azure and Guest OS Families

Recently one of our clients came to us with an issue: the Azure SDK 2.7.8 was being retired and needed updating for their Azure Cloud Services. OK, simple enough, but the issue was that the latest versions of the Azure SDK require .NET 4.6.2 to be installed in order to work. You see, when you deploy a Cloud Service to Azure, it is deployed to a VM that is spun up as part of the deployment process. And by default, Windows Server 2012 R2 does not come with .NET 4.6.2 installed. So the problem was: how do you get 4.6.2 installed on that flavour

ARM Template Snippet - Allow Azure Resources To Access Azure SQL Instance With This One Simple Trick!

Hello! I’m really getting into using ARM Templates to deploy Azure Resources. For those of you not in the know, and for the sake of brevity, I’ll summarise in a sentence: ARM Templates are basically Infrastructure as Code. It’s all very interesting, go and have a read. In addition to having our code in source control, the ability to deploy our entire infrastructure from source into Azure is fantastic, because now we can run one single script to build and deploy the infrastructure and the code from source, and we can all have our own personal environment in Azure. And
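The trick itself generally comes down to a firewall rule resource like the following sketch (parameter names are illustrative): the special rule spanning 0.0.0.0 to 0.0.0.0 is what switches on “Allow access to Azure services” for the server.

```json
{
  "type": "Microsoft.Sql/servers/firewallRules",
  "apiVersion": "2014-04-01",
  "name": "[concat(parameters('sqlServerName'), '/AllowAllWindowsAzureIps')]",
  "dependsOn": [
    "[resourceId('Microsoft.Sql/servers', parameters('sqlServerName'))]"
  ],
  "properties": {
    "startIpAddress": "0.0.0.0",
    "endIpAddress": "0.0.0.0"
  }
}
```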

Running SQL Server in an Azure Container Instance

Azure Container Instances are still in preview and not officially available for Windows yet, which made me smile. It took me a while to figure out how to get this working, so I thought I’d share what I’ve found. Containers are great for lightweight testing of code before deployment to production servers, because they can be created so quickly and they reliably provide the same environment to test in. Now that Microsoft is offering container instances in Azure, it means you don’t have to worry about provisioning and configuring your own Docker host/cluster. The options for deploying SQL Server
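For a flavour of how little is involved once it works, here is a hedged sketch using the AzureRM.ContainerInstance cmdlets; names, sizes and the image tag are placeholders, and Windows support was still preview at the time.

```powershell
New-AzureRmContainerGroup -ResourceGroupName "sql-aci-rg" -Name "sql-aci" `
    -Image "microsoft/mssql-server-windows-express" -OsType Windows `
    -Cpu 2 -MemoryInGB 4 -IpAddressType Public -Port 1433 `
    -EnvironmentVariable @{ ACCEPT_EULA = "Y"; sa_password = "P@ssw0rd123!" }  # demo password only
```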

Create an Azure Active Directory Application and Key using PowerShell

I’ve been a SQL developer for a good few years now, and have also developed numerous web applications, web services and various console apps. However, lately I find myself getting into the world of DevOps, Azure, and necessarily, PowerShell. Whilst familiar with PowerShell to a degree, I’ve learnt a lot over the past few weeks about the Azure PowerShell module, and how it can be used to script tasks that you might not want to do manually in the Azure portal if you’re thinking about automation. This post should help if you want to create an Azure Active Directory application
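Condensed to its essentials, the flow looks something like this; display names, URIs and the role are illustrative, and recent AzureRM versions expect the key as a SecureString.

```powershell
# 1. Create the AAD application
$app = New-AzureRmADApplication -DisplayName "MyAutomationApp" `
    -IdentifierUris "https://myautomationapp" -HomePage "https://myautomationapp"

# 2. Give it a service principal and a key (password) credential
New-AzureRmADServicePrincipal -ApplicationId $app.ApplicationId
$secret = ConvertTo-SecureString "use-a-generated-secret-here" -AsPlainText -Force
New-AzureRmADAppCredential -ApplicationId $app.ApplicationId -Password $secret

# 3. Grant the principal a role so it can actually do something
New-AzureRmRoleAssignment -RoleDefinitionName "Contributor" `
    -ServicePrincipalName $app.ApplicationId
```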

In a partitioned world, don’t violate core directive

This is another short post stemming from a recent talk I gave on Azure Cosmos Db vs. SQL Database, and there will be more based on discussion and feedback I received and things I learnt along the way. The point I want to make is that when implementing scale-out data storage, regardless of whether you are considering Azure SQL Database, Cosmos Db or another storage engine, you have to think differently about your read and write patterns. To paraphrase Conor Cunningham linkedin | blog from his excellent OLTP Sharding Techniques for Massive Scale presentation at SQL PASS

CosmosDb and CRUD with an S

In preparation for my talk Azure SQL Database vs. CosmosDb - How do you choose? at SQLBits, I spent a lot of time thinking about create, read, update and delete (CRUD) operations, and concluded that with the advent of CosmosDb and its multi-model abstraction layer, CRUD needs an S. CRUD as an acronym has served software engineering very well by encapsulating the core actions of data persistence and retrieval. In the transition from “ye olde” closed and monolithic systems to the current open, loosely coupled and distributed micro-service architectures via client server and

Automatic Tuning Enabled By Default In Azure SQL Database Happens Today!

Hello! From January 15th (i.e. this Monday), Automatic Tuning will be enabled by default and gradually rolled out to ALL Azure subscriptions. If you are the owner of a subscription you should have received an email two weeks ago alerting you to this fact. However, if you’re not, and this has not been communicated to you, this may come as something of a surprise. With regards to how the rollout impacts you, this blog post states that “All servers that do not have automatic tuning explicitly configured will inherit Azure defaults, making automatic tuning enabled. Similarly, all databases that do

Azure Automation Module Import: Sorry! You’re not my type.

I’ve just burnt several hours on an issue with importing PowerShell modules into an Azure Automation account using the New-AzureRmAutomationModule cmdlet. Although I could import the module with no issue using the portal, I was getting an error of “Import newer version failed” when trying to do it using the cmdlet, and when I checked the module in the portal the message was: “Error importing the module cAsosSQLAddons. Import failed with the following error: Orchestrator.Shared.AsyncModuleImport.ModuleImportException: No content was read from the supplied ContentLink. [ContentLink.Uri=https://xxxxx.blob.core.windows.net/dscresources/DSCResources.cXXXXSQLAddons/1.0.9/cXXXXSQLAddons.zip]” When importing a module using New-AzureRmAutomationModule, one of the parameters you must supply is
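For reference, the call in question looks like this (resource names are placeholders, reusing the post’s xxxxx convention); the ContentLink must point at a .zip the Automation service can actually read, for example via a SAS token.

```powershell
New-AzureRmAutomationModule -ResourceGroupName "automation-rg" `
    -AutomationAccountName "my-automation" -Name "cXXXXSQLAddons" `
    -ContentLink "https://xxxxx.blob.core.windows.net/dscresources/cXXXXSQLAddons.zip"
```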

CosmosDb, know your costs, and remember…

This will be a short post to emphasize a simple point, yet one that should make an enormous difference to how you approach configuring a CosmosDb collection and modelling documents to support read and write requirements. Know your costs: I cannot emphasize this point enough. The folks at Microsoft have made it really easy, be it via the Request Units (RU) and Data Storage calculator, the collection Query Explorer in the Azure Portal, or a REST client such as Postman coupled with the really useful library and samples by a Microsoftie over on git, documentdb postman collection. Let’s
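If Postman is not your thing, the same check can be scripted; here is a hedged PowerShell sketch of querying a fixed collection over the REST API and reading back the RU charge, with the account, key, database and collection ids all as placeholders.

```powershell
$account      = "myaccount"      # placeholder
$masterKey    = "<primary key from the portal>"
$resourceLink = "dbs/mydb/colls/mycoll"
$date         = [DateTime]::UtcNow.ToString("r").ToLowerInvariant()

# Sign the request the way the DocumentDB REST API expects
$stringToSign = "post`ndocs`n$resourceLink`n$date`n`n"
$hmac = New-Object System.Security.Cryptography.HMACSHA256
$hmac.Key = [Convert]::FromBase64String($masterKey)
$sig  = [Convert]::ToBase64String($hmac.ComputeHash([Text.Encoding]::UTF8.GetBytes($stringToSign)))
$auth = [Uri]::EscapeDataString("type=master&ver=1.0&sig=$sig")

$headers = @{
    "Authorization"           = $auth
    "x-ms-date"               = $date
    "x-ms-version"            = "2017-02-22"
    "x-ms-documentdb-isquery" = "true"
}
$body = '{"query":"SELECT * FROM c WHERE c.id = @id","parameters":[{"name":"@id","value":"1"}]}'

$resp = Invoke-WebRequest -Method Post -Uri "https://$account.documents.azure.com/$resourceLink/docs" `
    -Headers $headers -Body $body -ContentType "application/query+json"

# The RU cost of the operation comes back in a response header
$resp.Headers["x-ms-request-charge"]
```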

CosmosDb, know your partition costs, well more or less

In my previous post, Cosmos Db know your costs, and remember, I made the point that by understanding RU costs early, you can make informed decisions in relation to document design and application CRUD and query operations. While it is easy, and most certainly useful, to arrive at a projected RU cost using, for example, the Request Units (RU) and Data Storage calculator, or directly against a fixed 10GB collection via the Azure Portal (incidentally the same costs), the problem is that these do not highlight RU costs when partitioning is required to support scale-out. Now if you know your

Why is Sqlpackage Using All The Build Server Memory?

Sqlpackage can be particularly resource-intensive when scripting a database that has a considerable number of objects. In this post I'm going to discuss the options available when scripting out a database deployment file from a dacpac using sqlpackage.exe. I'm also going to investigate how resource-intensive they are, what we can do to limit the hardware resources used, and how much of an impact this has on our waiting times, with some interesting results on where we were taking the performance hit. Recently I've noticed that when we have more than one build running at the same time
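For context, the scripting action under discussion is along these lines (paths and names illustrative):

```powershell
& sqlpackage.exe /Action:Script `
    /SourceFile:"C:\build\MyDatabase.dacpac" `
    /TargetServerName:"myserver.database.windows.net" `
    /TargetDatabaseName:"MyDatabase" `
    /OutputPath:"C:\build\MyDatabase.deploy.sql"
```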

VSTS Hosted Build Specs: The Script

Some months back, I published a post about the VSTS Hosted Build Agent’s specs. One thing I didn’t add was the PowerShell script that I used to get these details, mainly because I couldn’t find the script anymore… So, by popular demand, here is the script I used to get the build specs. I ran it as an in-line PowerShell script as part of a build that was being run on the Hosted Build Agent. The CPU has changed since I last gathered the data about this: previously it
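The script and its output were originally shown as screenshots; what follows is my reconstruction of a script that produces output of that shape, not necessarily the original.

```powershell
# CPU details, matching the columns reported in the earlier post
Get-WmiObject -Class Win32_Processor |
    Select-Object SystemName, Name, DeviceID, NumberOfCores,
                  NumberOfLogicalProcessors, AddressWidth |
    Format-Table -AutoSize

# Total physical memory in MB
$memoryMb = (Get-WmiObject -Class Win32_ComputerSystem).TotalPhysicalMemory / 1MB
Write-Output "Total memory:  $memoryMb"
```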

SSDT 16.5 Released

Hello! Recently the SQL Tools Team released a new version of both SQL Server Data Tools (SSDT) and SQL Server Management Studio (SSMS). There’s a range of bug fixes, but two new features that I am particularly interested in. Firstly, SqlPackage.exe and the DacFx API can now generate a deployment report, generate a deployment script and publish to a database all in one action. Neat! This is useful because it’s important to keep track of exactly what has changed on a database. Of course, there’s nothing stopping you right now from executing these options as three separate actions, but there
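For the curious, the one-shot version looks roughly like this with SqlPackage (paths illustrative): publish, and capture both the deployment script and the deployment report from the same action.

```powershell
& sqlpackage.exe /Action:Publish `
    /SourceFile:"C:\build\MyDatabase.dacpac" `
    /TargetServerName:"myserver.database.windows.net" `
    /TargetDatabaseName:"MyDatabase" `
    /DeployScriptPath:"C:\build\MyDatabase.deploy.sql" `
    /DeployReportPath:"C:\build\MyDatabase.deployreport.xml"
```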

Assist Deploy Is Available on GitHub

Hello! For some time now I have been working on automating SSIS deployments, and earlier this week I published my efforts on GitHub. But before I get into the what/how, let’s focus on the why, and let me catch you up on how I got here… The task of taking an ispac and deploying it is, in and of itself, quite a straightforward process, as there are multiple ways to do this. For those of you who want the abridged version of the linked post, the choices are as follows: Integration Services Deploy Wizard, SSIS Catalog T-SQL API, PowerShell
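To illustrate just one of those choices, the SSIS Catalog T-SQL API boils down to something like this sketch (folder, project and server names are placeholders):

```powershell
# Read the ispac off disk and hand it to catalog.deploy_project
$deploySql = @'
DECLARE @project varbinary(max) =
    (SELECT BulkColumn FROM OPENROWSET(BULK N'C:\ispacs\WWI.ispac', SINGLE_BLOB) AS ispac);
DECLARE @operation_id bigint;
EXEC SSISDB.catalog.deploy_project
    @folder_name = N'WWI',
    @project_name = N'Daily ETL',
    @project_stream = @project,
    @operation_id = @operation_id OUTPUT;
SELECT @operation_id AS operation_id;
'@
Invoke-Sqlcmd -ServerInstance "MySsisServer" -Query $deploySql
```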

SSDT 16.5 Released part 2: Using the DacFx API and Samples!

Hello! Yesterday I posted about the new release of SSDT from the SQL Tools Team at Microsoft. One of the big changes is the ability to create the deployment report, create the deployment script and execute the deployment all in one command. The other change is that for Azure, two scripts are now generated: one for any changes that need a connection to master, and the other for changes to the user database. The samples yesterday showed how to execute the new method using SqlPackage, but a lot of people, myself included, have automated the deployment using the DacFx API through
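In PowerShell the new API shape comes out something like this sketch; the DLL path and property names reflect my understanding of the 16.5 DacFx and are worth verifying against the official samples.

```powershell
Add-Type -Path "C:\Program Files (x86)\Microsoft SQL Server\130\DAC\bin\Microsoft.SqlServer.Dac.dll"

$package  = [Microsoft.SqlServer.Dac.DacPackage]::Load("C:\build\MyDatabase.dacpac")
$services = New-Object Microsoft.SqlServer.Dac.DacServices("Server=myserver;Initial Catalog=master;Integrated Security=True")

# Ask for the report and script as part of the same publish
$options = New-Object Microsoft.SqlServer.Dac.PublishOptions
$options.GenerateDeploymentScript = $true
$options.GenerateDeploymentReport = $true
$options.DeployOptions = New-Object Microsoft.SqlServer.Dac.DacDeployOptions

$result = $services.Publish($package, "MyDatabase", $options)
$result.DatabaseScript | Out-File "C:\build\MyDatabase.deploy.sql"
$result.MasterDbScript | Out-File "C:\build\MyDatabase.master.sql"   # the Azure-only master script
```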

Using a Cloud Witness for Clusters

On a client site recently, a question was asked about the file share witness in an on-premise SQL Server failover cluster, and where to put it if you only have two sites. As always, it depends! Let’s look at some scenarios. Bear in mind that use of the Azure Cloud Witness requires Windows Server 2016 or later. Topology 1: a three-node cluster with two nodes at the primary site and one at disaster recovery (DR). Most people want high availability at their “primary site” and are happy to have standalone capability at the business continuity (DR) site. To save storage costs, I
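Configuring a cloud witness, once you have a storage account, is a one-liner (account name and key are placeholders):

```powershell
Set-ClusterQuorum -CloudWitness -AccountName "mywitnessstorage" `
    -AccessKey "<storage account access key>"
```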

PowerQuery – The power of M

I love PowerBI; actually, I love PowerQuery. It's a great way to combine data from around your business, across the web, from anywhere. What I really like is that very little is done automatically, i.e. it doesn't do the nice data type detection you get with Excel, which screws your imports if the data type in a column differs from the first few rows to the rest of the file. Does that make it difficult to use? No, it's not. The nice thing is that it's very easy to add additional columns, change data types, use expressions, combine datasets, and do

How to move a replication subscriber to a new server with no downtime to the publisher?

In a recent data centre migration for a client, we had a problem where we needed to move a subscriber to a new data centre without incurring any downtime on the publisher, or loss of data after the subscription migration. The application was sending hundreds of transactions per second to the publisher. An additional complication was an upgrade to SQL Server 2016 from SQL Server 2008 R2 on the subscriber. The first phase of the migration was to move the subscriber to a new server in a different domain, but without incurring any downtime for the publishing application. How to

SqlServer PowerShell Modules NuGet Package Now Available

Hello! Back in the July update of SSMS 2016, a bunch of new SQL PowerShell functions were added, plus two neat additions to Invoke-Sqlcmd: -OutputAs, which allows you to output the result set as a data object (e.g. data row, data table, etc.), and -ConnectionString, which allows you to pass in a connection string instead of using the pre-defined parameters. All very useful stuff, go and have a read. However, this update has two issues: firstly, it's not updating the classic sqlps module, but rather has created a new module: SqlServer. This new module will be regularly
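The two additions look like this in practice (server and query are illustrative):

```powershell
# -OutputAs returns real .NET data objects instead of formatted text
$rows = Invoke-Sqlcmd `
    -ConnectionString "Server=localhost;Database=master;Integrated Security=True" `
    -Query "SELECT name, database_id FROM sys.databases" `
    -OutputAs DataRows

$rows | Where-Object { $_.database_id -gt 4 } | Select-Object name
```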

SQL Supper Scripts

Hello! Thanks to everyone who turned up yesterday at SQL Supper: there was a good turnout of both new and familiar faces. The Demo Gods were with me, and I was able to log on to my Azure VM and deploy to SQL Azure. I’ve uploaded the scripts to gist and shared them below. I also spoke about raising a Connect issue so that the Microsoft.Build.Utilities.Core NuGet package will work with Microsoft.Data.Tools.Msbuild. I’d like to see this so that we do not have to install the Microsoft Build Tools 2015 MSI on the box. And this is important because

Home

Plan. Develop. Deploy. Measure. Sabin.io is a data engineering practice. Our focus is on helping companies deliver data systems. We help companies build sustainable applications that take support and ongoing development into consideration. The use of agile methodologies is fantastic, but it needs to be aligned with the engineering practices of testing, continuous integration/delivery and a feedback loop to enable continual improvement. Our experience stretches from Windows server management and automation through to the support and management of BI systems. We have strong application development experience, which means we are able to deliver and support solutions that work with all those involved

Where To Find Us at SQLBits

Hello! SQLBits is back! This year SQLBits is being hosted in the Grand Hallway at Olympia, which opened way back in 1886. This marks the return of Bits to London, and in fact to the south of England, for the first time since 2015. Back then it was hosted at the ExCeL Exhibition Centre in the east of London. If you have never heard of SQLBits before I’d be very surprised, as it is the largest SQL Server conference in Europe and offers world-class training. But enough of the history lesson, let’s talk about what’s coming up

SSDTPokedex: Migrating a Database Into SSDT

Hello! If you want to have the best chance of something being successful, you have to be committed to it from the start. That’s a pretty fatuous-sounding statement, almost as bad as “to make something better you have to do more of the good stuff and less of the bad stuff”, so let me contextualise: if you want good testing coverage on an application, then you need to be serious about testing from the first day you write code for the application. Be it manual or automated testing, you need to put the effort in terms

PowerShell Workflow Script To Stop VM’s In A Resource Group

Recently I needed to make sure that all the VMs in a given resource group were stopped, so I looked around the Runbooks available to download from the Azure Marketplace. Some of these were ridiculously complex: one was over 500 lines long! Just to stop a VM! Naturally there is some setup required: we need to get the names of the VMs in the Resource Group and, if they are running, stop them. However, the command to stop a VM is straightforward: Stop-AzureRmVM, followed by the name of the VM and the resource group. Quite frankly I’m
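The essence of the runbook fits in a dozen lines; this is a sketch rather than the marketplace version, and inside Azure Automation you would authenticate first (for example with a Run As connection).

```powershell
workflow Stop-VmsInResourceGroup {
    param ([string] $ResourceGroupName)

    $vms = Get-AzureRmVM -ResourceGroupName $ResourceGroupName
    foreach -parallel ($vm in $vms) {
        # -Force suppresses the confirmation prompt
        Stop-AzureRmVM -ResourceGroupName $ResourceGroupName -Name $vm.Name -Force
    }
}
```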

Using CloneDatabase

One of the most interesting features of SQL Server 2014 Service Pack 2 is the new management command DBCC CLONEDATABASE. The idea is to create an "empty" copy of a database: all the metadata and statistics of the original and the clone are identical, but the clone contains no data. I ran this on a very small (50 MB) database, and within a few seconds it was completed. I then ran it on AdventureWorks2014, which is 250 MB, and the time to clone was roughly the same. Your mileage may vary. The console window provides some information on what is
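Running it is as simple as this (the clone name is illustrative):

```powershell
Invoke-Sqlcmd -ServerInstance "localhost" `
    -Query "DBCC CLONEDATABASE (AdventureWorks2014, AdventureWorks2014_Clone);"
```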

T SQL Tuesday: Shipping Database Changes with SSDT

Hello! Let’s see how this goes: this month’s subject for T-SQL Tuesday is shipping database changes, something we here are all familiar with. So I thought I’d make some notes about a tool I’m very familiar with: SQL Server Data Tools. The Good: It’s free! SSDT works with Visual Studio Community up to Ultimate, and from Visual Studio 2015 onwards it comes with its own Visual Studio IDE. SSDT has a NuGet package available, so you don’t need to install Visual Studio to get builds running, and crucially you can control which version is used to compile at a

SQL Server Container Performance

Is SQL Server in a container faster than in a VM? I briefly looked at SQL Server containers when Windows Server 2016 was released. Containers offer rapid provisioning and denser utilization of hardware, because the container shares the base OS’s kernel; there is no need for a hypervisor layer in between. As a recap for those that are not up to speed with containers, the traditional architecture of databases in a VM is like so: the hypervisor OS is installed onto the host hardware, a physical server in the data centre, and many VMs are created on the hypervisor
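For comparison purposes, standing up the containerised side of the test is a single command; the image tag and password below are illustrative.

```powershell
docker run -d -p 1433:1433 `
    -e sa_password="P@ssw0rd123!" -e ACCEPT_EULA=Y `
    --name sql-perf-test microsoft/mssql-server-windows-developer
```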

Why is SSDT Always Rebuilding My Constraints?

Hello! Let me begin by saying that I’m a big fan of SSDT. It’s free, it works with all flavours of Visual Studio, and the team do their very best to keep up to date with features that are released with increasing regularity by Microsoft in both Azure and SQL Server on-prem. I’ve met the team a few times, and they’re genuinely keen to engage with the users of SSDT on how it can be better, and how it can be extensible. So, SSDT is a great tool. I said great, but not perfect. It has its limitations, the same as any tool.

VSTS Hosted Build Agent Specs

I was interested to know just what the hardware specification of the hosted build agent is, so I added some PowerShell to read out the info below:

```
2016-06-29T09:23:31.3935358Z systemname      Name                                      DeviceID NumberOfCores NumberOfLogicalProcessors Addresswidth
2016-06-29T09:23:31.3935358Z ----------      ----                                      -------- ------------- ------------------------- ------------
2016-06-29T09:23:31.3935358Z TASKAGENT5-0010 Intel(R) Xeon(R) CPU E5-2673 v3 @ 2.40GHz CPU0     2             2                         64
2016-06-29T09:23:31.4095356Z Total memory:  7167.55078125
```

What piqued my interest more was that this is the exact same spec as a D2 v2 box that is available via Azure. Clearly, Microsoft have a build agent template which is built, stored in a pool, and provisioned whenever a build

Webinar: 8th Feb 2018

Preventing SQL Server Performance Problems Before They Hit Production. Join Mark Allison, Paul Anderton and Kevin Kline at 3pm UTC on 8th Feb 2018. Wouldn’t it be great if performance problems in your SQL estate could be detected BEFORE they reach your production databases? In this demo-centred webinar we will review:

- How to detect and prevent releases of code that could reduce performance of your SQL Server database
- Ways to prevent the most common performance problems before they reach production: missing indexes, deadlocks and excessive key lookups
- How effective SentryOne can be in a DevOps pipeline, both on-premise and in

VMWare network performance bug - Getting a repro

If you’ve read my previous post about an issue with VMware ESX 6, connecting to SQL and 500ms latency, you might be interested in the process we went through to get to the repro. Getting a repro (being able to reproduce a bug/feature) is often a complex and time-consuming task. The challenge is like being Sherlock Holmes: using your experience to focus on the aspects of the situation that are important. The problem is that without a repro, you can’t give anything to a supplier to enable them to triage and find a fix for it

Automating SQL Server Performance Testing

You run performance tests as well as functional tests when deploying new code changes to SQL Server, right? Not many people do; I think you should, and this article will show you how to do it by harnessing an existing performance tool, rather than writing your own monitoring infrastructure from scratch. Any good performance monitoring tool that records information to a database will do fine, and we prefer to use SentryOne. Here are the steps to accomplish this. Create a baseline database: when you release your database change, you want to have something to compare against as an

Feedback requests to Microsoft

If you didn’t know, Microsoft has a number of channels for providing feedback. Historically the main one was Connect (connect.microsoft.com), which integrated with their internal bug-tracking systems and meant that items flowed from users to engineering and back. Well, it was supposed to.

- The SQL product group still use Connect https://connect.microsoft.com/sql, with a few teams also using Trello https://trello.com/b/NEerYXUU/powershell-sql-client-tools-sqlps-ssms and/or Slack sqlcommunity.slack.com
- Visual Studio is moving from Connect to https://developercommunity.visualstudio.com/spaces/8/index.html and also has https://visualstudio.uservoice.com/forums/121579-visual-studio-ide for ideas
- VSTS has great support, also uses MSDN, and takes requests on UserVoice https://visualstudio.uservoice.com/forums/330519-team-services
- PowerBI has forums and uses user voice

Log Shipping: It's Better Than Bad It's Good!

I'm probably showing my age by quoting an old Ren and Stimpy cartoon here, but to be fair it probably sums up log shipping pretty well. This post is focusing on using a read-only log shipping database for reporting purposes, and the limitations of read-only log shipped databases. I also share some monitoring scripts and a few ideas on how to improve restore performance without having to upgrade the hardware/software. Despite the development of AlwaysOn in recent releases of SQL Server, log shipping is still a great way to set up a copy of databases to be used for reporting.

How To Compile SQLProj Files Using Cmdline MSBuild... Errors Included!

I recently needed to build and deploy about 40 small database projects spread across 4 or 5 different database solutions, and I needed to do this several times a day, so compiling via Visual Studio would be a boring and tedious process. To speed things up I decided to write the build process in an MSBuild targets file and initiate the build through PowerShell. The targets file was simple enough to put together. This would be saved in the root location of all the solution folders as "BuildAllDBProjects.targets.xml". Then the PowerShell would be simple enough;
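The PowerShell side is little more than locating MSBuild and pointing it at the targets file; the MSBuild path and target name below are assumptions that will vary with your Visual Studio or Build Tools version.

```powershell
& "C:\Program Files (x86)\MSBuild\14.0\Bin\MSBuild.exe" `
    ".\BuildAllDBProjects.targets.xml" /t:Build /p:Configuration=Release
```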

Notes From The Field: Using Invoke-Sqlcmd

Lately I’ve been working quite a bit with Invoke-Sqlcmd, and there are a few issues with how it handles errors that I feel make it a poor choice of tool for connecting and executing SQL. Let’s take a look at a script that will return an expected error. Funnily enough, this does not return an error. The “$?” automatic variable reports whether the last operation succeeded, and since it returned “True” (i.e. no issues), PowerShell considers the query to be a success. This is a real problem. Even if we add error handling to the script, we still see the same result.
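A minimal repro of the behaviour, assuming the SqlServer module (exact behaviour varies a little between module versions):

```powershell
Invoke-Sqlcmd -ServerInstance "localhost" -Query "RAISERROR('Expected error', 16, 1);"
$?   # True - PowerShell considers the call a success despite the SQL error
```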

Don’t use Storage Spaces…

… if you care about performance in the slightest. That’s it really. You don’t need to read any further. What are Storage Spaces? Have a read of this quick overview: https://www.windowscentral.com/how-use-storage-spaces-windows-10 I had some spare computer parts lying around, so I thought I’d rebuild my Windows 10 desktop at home. I have 4 x 4TB Hitachi SATA drives and a spare hardware RAID controller, so I decided to put them in my desktop. I had heard of Storage Spaces and wanted to try it out to see how it would perform, considering there is no extra hardware involved in creating

SSDTPokedex: Integrating Slack and VSTS Into GitHub Repo - An Infinite Improvement

Hello! One of the home projects I’m currently working on is migrating a database from SQLite to SQL Server. There are several tasks that need to be accomplished before we can say that this is successful. Broadly speaking, they fit into the key pillars of successful software development: Plan, Develop, Deploy, Measure. So the development part is well under way: there is a repo in GitHub of an SSDT solution that will compile locally (it works on my machine, anyway), plus I have a couple of releases: one intentionally broken and one fixed. Now there are plenty of tasks I

How to install SQL Server on Windows Server Core?

As part of automating database and application deployments, it makes sense to be able to create new SQL Server instances quickly and with minimal resources. I have already explored containers and written about them on this blog, but I’d like to turn your attention to setting up SQL Server on Windows Server Core, for those of you that run SQL Server on-premise or within VMs in the cloud. In a domain environment it should be pretty simple: just create a PowerShell session to your target Windows Server, where your account is a local administrator, and then simply run
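Roughly like this, as a sketch; the server name, accounts and media path are placeholders, and the full set of setup.exe switches you need will depend on your environment.

```powershell
$session = New-PSSession -ComputerName "SQLCORE01"
Invoke-Command -Session $session -ScriptBlock {
    # /qs = quiet with progress; Server Core has no full UI for setup anyway
    D:\setup.exe /qs /ACTION=Install /FEATURES=SQLEngine /INSTANCENAME=MSSQLSERVER `
        /SQLSVCACCOUNT='CONTOSO\sqlsvc' /SQLSVCPASSWORD='<password>' `
        /SQLSYSADMINACCOUNTS='CONTOSO\dbateam' /IACCEPTSQLSERVERLICENSETERMS
}
```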